
    Procedural Modeling of Virtual Worlds by Occlusion Tiling (Modélisation procédurale de mondes virtuels par pavage d'occultation)

    Demonstration videos can be found at fr.linkedin.com/in/doriangomez/. This thesis deals with procedural modeling of extended virtual worlds in computer graphics. We exploit the visibility properties between elementary regions of the scene, which we call tiles, to control its construction by rectangular tiling, aiming at two distinct objectives: (1) providing artists with an efficient means to generate 3D content for very large virtual scenes, and (2) guaranteeing, from the moment the world is created, efficient rendering and visualization performance. To this end, we propose several methods for 2D and 3D visibility determination that evaluate potentially visible sets (PVS) at interactive or real-time rates. They are based on the computation of separating and supporting lines/planes of objects, as well as on hierarchies of the objects associated with tiles. The first (2D) method guarantees full occlusion of the view frustum beyond a fixed, designer-specified distance, regardless of the observer's location on the tiling. The second method estimates and localizes the tiles to which visibility propagates, and builds up the virtual world accordingly. To generate more varied worlds, we then extend this method to 3D. Finally, we propose two methods that optimize object placement on tiles, improving their occlusion properties and their impact on rendering performance while preserving the atmosphere created by the artist's initial placement choices.
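    To make the separating/supporting-line vocabulary concrete, here is a minimal 2D sketch, assuming objects are approximated by discs; the function and its disc representation are illustrative, not taken from the thesis. Supporting lines bound the region where one disc fully occludes the other, while separating lines bound the zone of partial occlusion.

        import math

        def tangent_lines(c1, r1, c2, r2, internal):
            """Tangent lines between two discs, as (point-on-disc-1, point-on-disc-2) pairs.

            internal=True  -> separating lines (exist iff the discs are disjoint)
            internal=False -> supporting lines (exist iff neither disc contains the other)
            """
            (x1, y1), (x2, y2) = c1, c2
            dx, dy = x2 - x1, y2 - y1
            d = math.hypot(dx, dy)
            s = -r2 if internal else r2           # flip the second radius for separating lines
            if d <= abs(r1 - s):                  # tangent of this kind does not exist
                return []
            base = math.atan2(dy, dx)
            alpha = math.acos((r1 - s) / d)       # angle from the center line to the contact radius
            lines = []
            for sign in (+1, -1):
                theta = base + sign * alpha
                p1 = (x1 + r1 * math.cos(theta), y1 + r1 * math.sin(theta))
                p2 = (x2 + s * math.cos(theta), y2 + s * math.sin(theta))
                lines.append((p1, p2))
            return lines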

    Time and Space Coherent Occlusion Culling for Tileable Extended 3D Worlds

    In order to interactively render large virtual worlds, the amount of 3D geometry passed to the graphics hardware must be kept to a minimum. Typical solutions to this problem include the use of potentially visible sets and occlusion culling; however, these solutions do not scale well, in either time or memory, with the size of a virtual world. We propose a fast and inexpensive variant of occlusion culling, tailored to a simple tiling scheme, that improves scalability while maintaining very high performance. Tile visibilities are evaluated with hardware-accelerated occlusion queries, and in-tile rendering is rapidly computed using BVH instantiation and any visibility method; we use the CHC++ occlusion culling method for its good general performance. Tiles are instantiated only when tested locally for visibility, thus avoiding the need for a preconstructed global structure for the complete world. Our approach can render large-scale, diversified virtual worlds with complex geometry, such as cities or forests, at high performance and with a modest memory footprint.
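    A minimal sketch of the per-frame tile gating described above, under loose assumptions: gpu, tiling, and render_with_chcpp are hypothetical stand-ins for a hardware occlusion-query wrapper, the tile grid, and the in-tile CHC++ pass, not the authors' API.

        def render_frame(camera, tiling, instantiated, gpu):
            # instantiated: dict mapping tile keys to lazily built BVH instances.
            for tile in tiling.tiles_in_frustum(camera):
                # Cheap test first: occlusion-query the tile's bounding box
                # against the depth buffer, with color and depth writes disabled.
                query = gpu.begin_occlusion_query()
                gpu.draw_bounding_box(tile.bounds)
                gpu.end_occlusion_query(query)
                if gpu.visible_sample_count(query) == 0:
                    continue                      # tile fully occluded: skip it entirely

                # Instantiate on first visibility, so no preconstructed
                # global structure for the complete world is ever needed.
                if tile.key not in instantiated:
                    instantiated[tile.key] = tile.instantiate_bvh()

                # Inside a visible tile, any in-tile visibility method works;
                # the paper uses CHC++ for its good general performance.
                render_with_chcpp(camera, instantiated[tile.key], gpu)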

    CMS distributed computing workflow experience

    The vast majority of the CMS computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN, and provide access to all recorded and simulated data for the Tier-2 sites via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level; these include re-reconstruction and skimming workflows of collision data, as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail, along with the operational optimization of resource usage. In particular, the variation of the different workflows during the 2010 data-taking period, their efficiencies and latencies, and their impact on the delivery of physics results are discussed, and lessons are drawn from this experience. The simulation of proton-proton collisions for the CMS experiment is primarily carried out at the second tier of the CMS computing infrastructure. Half of the Tier-2 sites of CMS are reserved for central Monte Carlo (MC) production, while the other half is available for user analysis. This paper summarizes the large throughput of the MC production operation during the 2010 data-taking period and discusses the latencies and efficiencies of the various types of MC production workflows. We present the operational procedures used to optimize the usage of available resources, and we describe the operational model of CMS for including opportunistic resources, such as the larger Tier-3 sites, in central production operations.

    Simulation and Mechanistic Investigation of the Arrhythmogenic Role of the Late Sodium Current in Human Heart Failure

    Heart failure constitutes a major public health problem worldwide. The electrophysiological remodeling of failing hearts sets the stage for malignant arrhythmias, in which the role of the late Na+ current (INaL) is relevant and is currently under investigation. In this study we examined the role of INaL in the electrophysiological phenotype of ventricular myocytes and its proarrhythmic effects in the failing heart. A model of cellular heart failure was proposed using a modified version of the Grandi et al. model of the human ventricular action potential that incorporates a formulation of INaL. A sensitivity analysis of the model was performed, and simulations of the pathological electrical activity of the cell were conducted. The proposed model of the human INaL and of the electrophysiological remodeling of myocytes from failing hearts accurately reproduces experimental observations. The sensitivity analysis of how ion channel remodeling modulates the electrophysiological parameters of failing myocytes revealed a role for INaL in the prolongation of action potential duration (APD), triangulation of the AP shape, and changes in the Ca2+ transient. A mechanistic investigation of intracellular Na+ accumulation and APD shortening with increasing stimulation frequency in failing myocytes revealed roles for the Na+/K+ pump, the Na+/Ca2+ exchanger, and INaL. The simulations also showed that, in failing myocytes, the enhancement of INaL increased the reverse rate-dependent APD prolongation and the probability of initiating early afterdepolarizations. The electrophysiological remodeling of failing hearts, and especially the enhancement of INaL, prolongs APD and alters the Ca2+ transient, facilitating the development of early afterdepolarizations. An enhanced INaL appears to be an important contributor to the electrophysiological phenotype and to the dysregulation of [Ca2+]i homeostasis in failing myocytes.
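    For orientation, a generic Hodgkin-Huxley style formulation of a late Na+ current is sketched below; the conductance and gating parameters are placeholders in the style of published human ventricular models, not the values fitted in this study.

        import math

        G_NA_L = 0.0065   # maximal conductance (mS/uF), placeholder value
        E_NA = 65.0       # Na+ reversal potential (mV), placeholder value
        TAU_H = 200.0     # slow inactivation time constant (ms), placeholder value

        def m_inf(v):     # steady-state activation gate
            return 1.0 / (1.0 + math.exp(-(v + 42.85) / 5.264))

        def h_inf(v):     # steady-state inactivation gate
            return 1.0 / (1.0 + math.exp((v + 87.61) / 7.488))

        def step_inal(v, h, dt):
            """One forward-Euler step; returns (INaL, updated h).

            INaL = gNaL * mL * hL * (V - ENa), with activation treated as
            instantaneous and inactivation relaxing slowly toward h_inf.
            """
            h += dt * (h_inf(v) - h) / TAU_H
            return G_NA_L * m_inf(v) * h * (v - E_NA), h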

    A Glycemia Risk Index (GRI) of Hypoglycemia and Hyperglycemia for Continuous Glucose Monitoring Validated by Clinician Ratings

    Background: A composite metric for the quality of glycemia from continuous glucose monitor (CGM) tracings could be useful for assisting with basic clinical interpretation of CGM data.
    Methods: We assembled a data set of 14-day CGM tracings from 225 insulin-treated adults with diabetes. Using a balanced incomplete block design, 330 clinicians who were highly experienced with CGM analysis and interpretation ranked the CGM tracings from best to worst quality of glycemia. We used principal component analysis and multiple regressions to develop a model to predict the clinician ranking based on seven standard metrics in an Ambulatory Glucose Profile: very low-glucose and low-glucose hypoglycemia; very high-glucose and high-glucose hyperglycemia; time in range; mean glucose; and coefficient of variation.
    Results: The analysis showed that clinician rankings depend on two components: one related to hypoglycemia, which gives more weight to very low-glucose than to low-glucose, and one related to hyperglycemia, which likewise gives greater weight to very high-glucose than to high-glucose. These two components should be calculated and displayed separately, but they can also be combined into a single Glycemia Risk Index (GRI) that corresponds closely to the clinician rankings of the overall quality of glycemia (r = 0.95). The GRI can be displayed graphically on a GRI Grid, with the hypoglycemia component on the horizontal axis and the hyperglycemia component on the vertical axis. Diagonal lines divide the graph into five zones (quintiles) corresponding to the best (0th to 20th percentile) through worst (81st to 100th percentile) overall quality of glycemia. The GRI Grid enables users to track sequential changes within an individual over time and to compare groups of individuals.
    Conclusion: The GRI is a single-number summary of the quality of glycemia. Its hypoglycemia and hyperglycemia components provide actionable scores and a graphical display (the GRI Grid) that can be used by clinicians and researchers to determine the glycemic effects of prescribed and investigational treatments.
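    As a worked example of the composite score, the sketch below computes GRI from the four CGM time-in-range percentages; the weights follow the published GRI formula, but readers should verify them against the original article before use.

        def gri(vlow, low, vhigh, high):
            """Glycemia Risk Index from CGM percentages of time in each range.

            vlow:  % time < 54 mg/dL     low:  % time 54-69 mg/dL
            vhigh: % time > 250 mg/dL    high: % time 181-250 mg/dL
            """
            hypo = vlow + 0.8 * low       # very low weighted more than low
            hyper = vhigh + 0.5 * high    # very high weighted more than high
            return min(3.0 * hypo + 1.6 * hyper, 100.0), hypo, hyper

        # Example: 1% very low, 3% low, 5% very high, 20% high
        # -> hypo = 3.4, hyper = 15.0, GRI = 3.0*3.4 + 1.6*15.0 = 34.2
        score, hypo, hyper = gri(1.0, 3.0, 5.0, 20.0)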

    Juxtaposing BTE and ATE – on the role of the European insurance industry in funding civil litigation

    One of the ways in which legal services are financed, and indeed shaped, is through private insurance arrangements. Two contrasting types of legal expenses insurance (LEI) contracts seem to dominate in Europe: before-the-event (BTE) and after-the-event (ATE) legal expenses insurance. Notwithstanding institutional differences between legal systems, BTE and ATE insurance arrangements may be instrumental if government policy is geared towards strengthening a market-oriented system of financing access to justice for individuals and businesses. At the same time, emphasizing the role of a private industry as a keeper of the gates to justice raises issues of accountability and transparency not readily reconcilable with the demands of competition. Moreover, multiple actors (clients, lawyers, courts, insurers) are involved, causing behavioural dynamics that are not easily predicted or influenced. Against this background, this paper looks into BTE and ATE arrangements by analysing the particularities of those currently available in some European jurisdictions and by painting a picture of their respective markets and legal contexts. This allows for some reflection on the performance of BTE and ATE providers as both financiers and keepers. Two issues emerge from the analysis that are worthy of further reflection: first, the problematic long-term sustainability of some ATE products; second, the challenges faced by policymakers who would like to nudge consumers into voluntarily taking out BTE LEI.

    Financial Performance Assessment of Cooperatives in Pelalawan Regency (Penilaian Kinerja Keuangan Koperasi di Kabupaten Pelalawan)

    This paper describes the development and financial performance of cooperatives in Pelalawan Regency during 2007-2008. The study covers primary and secondary cooperatives in 12 sub-districts. The method measures cooperative performance in terms of productivity, efficiency, growth, liquidity, and solvency. The productivity of cooperatives in Pelalawan was high, but efficiency remained low. Profit and income were high, liquidity was very high, and solvency was good.
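    The paper does not spell out its formulas, so the sketch below shows the usual textbook ratios behind the performance measures it names; these are standard definitions, not necessarily the authors' exact method.

        def liquidity(current_assets, current_liabilities):
            return current_assets / current_liabilities    # current ratio

        def solvency(total_assets, total_liabilities):
            return total_assets / total_liabilities        # asset coverage of debt

        def efficiency(operating_expenses, gross_income):
            return operating_expenses / gross_income       # lower is better

        def productivity(net_income, total_assets):
            return net_income / total_assets               # return on assets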

    Search for stop and higgsino production using diphoton Higgs boson decays

    Results are presented of a search for a "natural" supersymmetry scenario with gauge-mediated symmetry breaking. It is assumed that only the supersymmetric partners of the top quark (stop) and the Higgs boson (higgsino) are accessible. Events are examined in which there are two photons forming a Higgs boson candidate, and at least two b-quark jets. In 19.7 inverse femtobarns of proton-proton collision data at sqrt(s) = 8 TeV recorded by the CMS experiment, no evidence of a signal is found, and lower limits are set at the 95% confidence level, excluding stop masses below 360 to 410 GeV, depending on the higgsino mass.